This paper studies the subspace segmentation problem. Given a set of data points drawn from a union of subspaces, the goal is to partition them into the underlying subspaces they were drawn from. Spectral clustering is used as the framework. It requires an affinity matrix which is close to block diagonal, with nonzero entries corresponding to pairs of data points from the same subspace. In this work, we argue that both sparsity and the grouping effect are important for subspace segmentation. A sparse affinity matrix tends to be block diagonal, with fewer connections between data points from different subspaces. The grouping effect ensures that highly correlated data, which usually come from the same subspace, are grouped together. Sparse Subspace Clustering (SSC), by using $\ell^1$-minimization, encourages sparsity in data selection, but it lacks the grouping effect. On the contrary, Low-Rank Representation (LRR), by rank minimization, and Least Squares Regression (LSR), by $\ell^2$-regularization, exhibit a strong grouping effect, but they fall short in subset selection. Thus the affinity matrix obtained by SSC is usually very sparse, while those obtained by LRR and LSR are very dense. In this work, we propose the Correlation Adaptive Subspace Segmentation (CASS) method based on trace Lasso. CASS is a data-correlation-dependent method which simultaneously performs automatic data selection and groups correlated data together. It can be regarded as a method that adaptively balances SSC and LSR. Both theoretical and experimental results show the effectiveness of CASS.
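The adaptivity claimed above comes from the trace Lasso regularizer $\|X \operatorname{Diag}(w)\|_*$ (the nuclear norm of the data matrix with columns rescaled by the coefficient vector): for orthogonal data columns it reduces to the $\ell^1$ norm of $w$ (SSC-like sparsity), and for identical columns it reduces to the $\ell^2$ norm (LSR-like grouping). A minimal numpy sketch of this interpolation, with a hypothetical helper `trace_lasso` (not from the paper's code):

```python
import numpy as np

def trace_lasso(X, w):
    """Trace Lasso norm ||X Diag(w)||_*: the sum of singular
    values (nuclear norm) of the column-rescaled data matrix."""
    return np.linalg.norm(X @ np.diag(w), ord='nuc')

w = np.array([3.0, 4.0])

# Orthonormal columns: X Diag(w) has singular values |w_i|,
# so the trace Lasso equals the l1 norm ||w||_1 = 7.
X_orth = np.eye(2)
print(trace_lasso(X_orth, w))  # 7.0

# Identical (perfectly correlated) unit columns: X Diag(w) = x w^T
# is rank one, so the trace Lasso equals the l2 norm ||w||_2 = 5.
X_corr = np.array([[1.0, 1.0],
                   [0.0, 0.0]])
print(trace_lasso(X_corr, w))  # 5.0
```

Between these two extremes, the value depends on how correlated the columns of $X$ are, which is exactly the data-dependent balance between sparsity and the grouping effect described in the abstract.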